Search results for "neural computation"
Showing 10 of 11 documents
2019
As rats learn to search for multiple sources of food or water in a complex environment, they generate increasingly efficient trajectories between reward sites. Such spatial navigation capacity involves the replay of hippocampal place cells during awake states, generating small sequences of spatially related place-cell activity that we call "snippets". These snippets occur primarily during sharp-wave ripples (SWRs). Here we focus on the role of such replay events as the animal learns a traveling salesperson task (TSP) across multiple trials. We hypothesize that snippet replay generates synthetic data that can substantially expand and restructure the experience available and make learni…
A stable brain from unstable components: Emerging concepts and implications for neural computation.
2017
Neuroscientists have often described the adult brain in terms similar to an electronic circuit board: dependent on fixed, precise connectivity. However, with the advent of technologies allowing chronic measurements of neural structure and function, the emerging picture is that neural networks undergo significant remodeling over multiple timescales, even in the absence of experimenter-induced learning or sensory perturbation. Here, we attempt to reconcile the parallel observations that critical brain functions are stably maintained, while synapse- and single-cell properties appear to be reformatted regularly throughout adult life. In this review, we discuss experimental evidence at multiple …
Relation between fixation disparity and the asymmetry between convergent and divergent disparity step responses
2007
The neural network model of Patel et al. [Patel, S. S., Jiang, B. C., & Ogmen, H. (2001). Vergence dynamics predict fixation disparity. Neural Computation, 13(7), 1495–1525] predicts that fixation disparity, the vergence error for a stationary fusion stimulus, is the result of asymmetrical dynamic properties of disparity vergence mechanisms: faster (slower) convergent than divergent responses give rise to an eso (exo) fixation disparity, i.e., over-convergence (under-convergence) in stationary fixation. This hypothesis was tested in the present study with an inter-individual approach: in 16 subjects we estimated the vergence step response to a 1 deg disparity stimulus with a subje…
Why Cortices? Neural Computation in the Vertebrate Visual System
1989
We propose three high-level structural principles of neural networks in the vertebrate visual cortex and discuss some of their computational implications for early vision: (a) Lamination, average axonal and dendritic domains, and intrinsic feedback determine the spatio-temporal interactions in cortical processing. Possible applications of the resulting filters include continuous motion perception and the direct measurement of high-level parameters of image flow. (b) Retinotopic mapping is an emergent property of massively parallel connections. Combined with a local intrinsic operation in the target area, mapping yields a space-variant image processing system as would be useful in the analysis of …
Why Cortices? Neural Networks for Visual Information Processing
1989
Neural networks for the processing of sensory information show remarkable similarities between different species and across different sensory modalities. As an example, the cortical organization found in the mammalian neopallium and in the optic tecta of most vertebrates appears to be equally appropriate as a substrate for visual, auditory, and somatosensory information processing. In this paper, we formulate three structural principles of the vertebrate visual cortex that allow the structure and function of these neural networks to be analyzed at an intermediate level of complexity. Computational applications are taken from the field of early vision. The proposed principles are: (a) Average anatomy, i …
The Stability-Plasticity Dilemma: Investigating the Continuum from Catastrophic Forgetting to Age-Limited Learning Effects
2013
The stability-plasticity dilemma is a well-known constraint for artificial and biological neural systems. The basic idea is that learning in a parallel and distributed system requires plasticity for the integration of new knowledge, but also stability in order to prevent the forgetting of previous knowledge. Too much plasticity will result in previously encoded data being constantly forgotten, whereas too much stability will impede the efficient coding of this data at the level of the synapses. However, for the most part, neural computation research has addressed the problems of excessive plasticity and excessive stability as two separate fields in the literature.
A Survey of Continuous-Time Computation Theory
1997
Motivated partly by the resurgence of neural computation research, and partly by advances in device technology, there has been a recent increase of interest in analog, continuous-time computation. However, while special-case algorithms and devices are being developed, relatively little work exists on the general theory of continuous-time models of computation. In this paper, we survey the existing models and results in this area, and point to some of the open research questions.
Coarse scales are sufficient for efficient categorization of emotional facial expressions: Evidence from neural computation
2010
The human perceptual system performs rapid processing within the early visual system: low spatial frequency information is processed rapidly through magnocellular layers, whereas the parvocellular layers process all the spatial frequencies more slowly. The purpose of the present paper is to test the usefulness of low spatial frequency (LSF) information compared to high spatial frequency (HSF) and broad spatial frequency (BSF) visual stimuli in a classification task of emotional facial expressions (EFE) by artificial neural networks. The connectionist modeling results show that the LSF information provided by the frequency domain is sufficient for a distributed neural network to correctly cla…
Exponential Transients in Continuous-Time Symmetric Hopfield Nets
2001
We establish a fundamental result in the theory of continuous-time neural computation by showing that so-called continuous-time symmetric Hopfield nets, whose asymptotic convergence is always guaranteed by the existence of a Lyapunov function, may in the worst case possess a transient period that is exponential in the network size. The result stands in contrast to, e.g., the use of such network models in combinatorial optimization applications.
Psychophysically Tuned Divisive Normalization Approximately Factorizes the PDF of Natural Images
2010
The conventional approach in computational neuroscience in favor of the efficient coding hypothesis goes from image statistics to perception. It has been argued that the behavior of the early stages of biological visual processing (e.g., spatial frequency analyzers and their nonlinearities) may be obtained from image samples and the efficient coding hypothesis using no psychophysical or physiological information. In this work we address the same issue in the opposite direction: from perception to image statistics. We show that a psychophysically fitted image representation in V1 has appealing statistical properties, for example, approximate PDF factorization and substantial mutual informatio…